Llama3.1 MOE 4X8B Gated IQ Multi Tier COGITO Deep Reasoning 32B GGUF
License: Apache-2.0
A Mixture of Experts (MoE) model with adjustable reasoning capabilities that enhances inference and text generation through the collaboration of four Llama 3.1 8B expert models (roughly 32B parameters in total).
A large language model with multilingual support.
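Because the model ships as a GGUF, the "adjustable" part of its reasoning can be exercised at load time by changing how many experts are consulted per token. Below is a minimal sketch using the llama-cpp-python bindings, assuming a locally downloaded quantization of this repo; the file name and the expert count of 2 are illustrative assumptions, not values confirmed by this card.

```python
from llama_cpp import Llama

llm = Llama(
    # Hypothetical local file name for one quantization of this repo.
    model_path="Llama3.1-MOE-4X8B-Gated-IQ-Multi-Tier-COGITO-Deep-Reasoning-32B-Q4_K_M.gguf",
    n_ctx=4096,
    # For a Llama-architecture MoE GGUF, llama.cpp stores the number of
    # experts consulted per token under this metadata key. Overriding it
    # (here to 2 of the 4 experts, an assumed setting) trades generation
    # speed against deeper multi-expert reasoning.
    kv_overrides={"llama.expert_used_count": 2},
)

out = llm("Explain step by step why the sky is blue.", max_tokens=256)
print(out["choices"][0]["text"])
```

Raising the override toward 4 activates all experts for every token, which tends to improve reasoning depth at the cost of throughput; lowering it speeds up generation.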